Search results for "dimension reduction"
Showing 10 of 15 documents
Dimension Estimation in Two-Dimensional PCA
2021
We propose an automated way of determining the optimal number of low-rank components in dimension reduction of image data. The method is based on the combination of two-dimensional principal component analysis and an augmentation estimator proposed recently in the literature. Intuitively, the main idea is to combine a scree plot with information extracted from the eigenvectors of a variation matrix. Simulation studies show that the method provides accurate estimates and a demonstration with a finger data set showcases its performance in practice.
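The scree-plot half of the idea can be sketched in a few lines; what follows is a minimal variance-explained cutoff on simulated data (the data, the 95% threshold, and the plain-PCA setting are illustrative assumptions, not the paper's augmentation estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 samples, 10 features, 3 true low-rank components.
scores = rng.normal(size=(200, 3))
loadings = rng.normal(size=(3, 10))
X = 3.0 * scores @ loadings + rng.normal(size=(200, 10))

# Eigenvalues of the sample covariance, largest first (the scree plot values).
eigvals = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]

# Simple scree-style rule: smallest k explaining 95% of total variance.
explained = np.cumsum(eigvals) / eigvals.sum()
k = int(np.searchsorted(explained, 0.95)) + 1
print(k)
```

The augmentation estimator replaces this ad hoc threshold with information from the eigenvectors, which is what makes the choice of k automatic.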
Data-driven analysis for fMRI during naturalistic music listening
2017
Interest towards higher ecological validity in functional magnetic resonance imaging (fMRI) experiments has been growing steadily since the turn of the millennium. The trend is reflected in the increasing number of naturalistic experiments, where participants are exposed to complex real-world stimuli and/or cognitive tasks such as watching a movie, playing video games, or listening to music. Multifaceted stimuli forming parallel streams of input information, combined with reduced control over experimental variables, introduce a number of methodological challenges associated with isolating brain responses to individual events. This exploratory work demonstrated some of those methodological challeng…
Efficient unsupervised clustering for spatial bird population analysis along the Loire river
2015
This paper focuses on the application and comparison of Non Linear Dimensionality Reduction (NLDR) methods on a natural, high-dimensional bird-community dataset along the Loire River (France). In this context, biologists usually use the well-known PCA to explain the upstream-downstream gradient. Unfortunately, this method was unsuccessful on this kind of nonlinear dataset. The goal of this paper is to compare recent NLDR methods, coupled with different data transformations, in order to find the best approach. Results show that Multiscale Jensen-Shannon Embedding (Ms JSE) outperforms all other methods in this context.
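Why a linear method like PCA fails on such data can be sketched with a toy example: a one-dimensional gradient embedded nonlinearly in two dimensions (the parabola below is a stand-in assumption, not the bird-community data; Ms JSE itself is not reproduced here):

```python
import numpy as np

# A 1-D "gradient" (think upstream-downstream position) embedded on a curve.
t = np.linspace(-1, 1, 200)
X = np.column_stack([t, t ** 2])
X -= X.mean(axis=0)

# PCA: share of variance captured by the single leading component.
eigvals = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
linear_share = eigvals[0] / eigvals.sum()

# The data are intrinsically one-dimensional: arc length along the curve
# recovers the gradient almost perfectly, which is what NLDR methods exploit.
arc = np.concatenate([[0.0], np.cumsum(np.linalg.norm(np.diff(X, axis=0), axis=1))])
corr = np.corrcoef(arc, t)[0, 1]
print(round(linear_share, 3), round(corr, 3))
```

One linear component leaves a substantial share of variance unexplained, while a distance-based nonlinear coordinate correlates almost perfectly with the true gradient.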
Dimension reduction for time series in a blind source separation context using R
2021
Funding Information: The work of KN was supported by the CRoNoS COST Action IC1408 and the Austrian Science Fund P31881-N32. The work of ST was supported by the CRoNoS COST Action IC1408. The work of JV was supported by the Academy of Finland (grant 321883). We would like to thank the anonymous reviewers for their comments, which improved the paper and package considerably. Publisher Copyright: © 2021, American Statistical Association. All rights reserved. Multivariate time series observations are increasingly common in many fields of science, but the complex dependencies of such data often translate into intractable models with a large number of parameters. An alternative is given by first red…
3D-2D dimensional reduction for a nonlinear optimal design problem with perimeter penalization
2012
A 3D-2D dimension reduction for a nonlinear optimal design problem with a perimeter penalization is performed in the realm of $\Gamma$-convergence, providing an integral representation for the limit functional.
Signal dimension estimation in BSS models with serial dependence
2022
Many modern multivariate time series datasets contain a large amount of noise, and the first step of the data analysis is to separate the noise channels from the signals of interest. A crucial part of this dimension reduction is determining the number of signals. In this paper we approach this problem by considering a noisy latent variable time series model which comprises many popular blind source separation models. We propose a general framework for the estimation of the signal dimension that is based on testing for sub-sphericity and give examples of different tests suitable for time series settings. In the inference we rely on bootstrap null distributions. Several simulation studies are…
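The testing idea can be sketched with a minimal bootstrap. Everything concrete below is an illustrative assumption (the simulated series, the variance-based sphericity statistic, and the Gaussian resampling null); the paper's framework covers more refined tests tailored to serial dependence:

```python
import numpy as np

rng = np.random.default_rng(1)

def subsphericity_pvalue(X, k, B=200):
    """Bootstrap p-value for 'the p - k smallest eigenvalues are equal',
    i.e. everything beyond the first k components is spherical noise."""
    n, p = X.shape
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
    tail = eigvals[k:]
    stat = np.var(tail) / np.mean(tail) ** 2

    # Null distribution of the statistic under spherical Gaussian noise.
    null = np.empty(B)
    scale = np.sqrt(tail.mean())
    for b in range(B):
        Z = rng.normal(scale=scale, size=(n, p - k))
        ev = np.linalg.eigvalsh(np.cov(Z, rowvar=False))
        null[b] = np.var(ev) / np.mean(ev) ** 2
    return (1 + np.sum(null >= stat)) / (B + 1)

# Toy model: 2 latent signals observed in 5 channels plus isotropic noise.
n, k_true, p = 500, 2, 5
t = np.arange(n)
signals = np.column_stack([np.sin(2 * np.pi * t / 50),
                           np.sign(np.sin(2 * np.pi * t / 31))])
X = signals @ rng.normal(size=(k_true, p)) + 0.5 * rng.normal(size=(n, p))

# Estimated dimension: smallest k whose sub-sphericity is not rejected.
est = next(k for k in range(p) if subsphericity_pvalue(X, k) > 0.01)
print(est)
```

Hypotheses with too few signal components are rejected because a large signal eigenvalue inflates the statistic; the first non-rejected k is the estimated signal dimension.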
Combining PCA and multiset CCA for dimension reduction when group ICA is applied to decompose naturalistic fMRI data
2015
An extension of group independent component analysis (GICA) is introduced, in which multi-set canonical correlation analysis (MCCA) is combined with principal component analysis (PCA) for three-stage dimension reduction. The method is applied to naturalistic functional MRI (fMRI) images acquired during a task-free continuous music listening experiment, and the results are compared with the outcome of conventional GICA. The extended GICA resulted in slightly faster ICA convergence and, more interestingly, extracted more stimulus-related components than its conventional counterpart. Therefore, we consider the extension a beneficial enhancement for GICA, especially when applied to challenging fMRI d…
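The conventional two-stage reduction that this extension builds on can be sketched as follows; all sizes and component counts are hypothetical, and the proposed method's MCCA step (inserted between the two PCA stages) is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical fMRI-like data: 3 subjects, 100 time points, 50 voxels each.
subjects = [rng.normal(size=(100, 50)) for _ in range(3)]

def pca_reduce(X, k):
    # Project onto the top-k left singular directions of the centered data.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k].T @ Xc          # k x voxels

# Stage 1: subject-level PCA (the point where MCCA can be inserted).
reduced = [pca_reduce(X, 20) for X in subjects]

# Stage 2: temporal concatenation followed by group-level PCA.
stacked = np.concatenate(reduced, axis=0)   # 60 x 50
group = pca_reduce(stacked, 10)             # 10 x 50, the input to ICA
print(group.shape)
```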
Dimensional reduction for energies with linear growth involving the bending moment
2008
A $\Gamma$-convergence analysis is used to perform a 3D-2D dimension reduction of variational problems with linear growth. The adopted scaling gives rise to a nonlinear membrane model which, because of the presence of higher order external loadings inducing a bending moment, may depend on the average in the transverse direction of a Cosserat vector field, as well as on the deformation of the mid-plane. The assumption of linear growth on the energy leads to an asymptotic analysis in the spaces of measures and of functions with bounded variation.
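Such reductions typically start from an energy on a thin cylinder $\Omega_\varepsilon = \omega \times (-\varepsilon, \varepsilon)$ and identify the $\Gamma$-limit of the rescaled energies as the thickness vanishes. A generic sketch of the scheme (the precise scaling, growth conditions, and the Cosserat-type corrections are specific to each paper) is

$$ E_\varepsilon(u) \;=\; \frac{1}{2\varepsilon} \int_{\Omega_\varepsilon} W(\nabla u)\, dx \;\xrightarrow[\varepsilon \to 0]{\ \Gamma\ }\; E_0(v) \;=\; \int_{\omega} \overline{W}\big(\nabla' v\big)\, dx', $$

where $\nabla'$ denotes the in-plane gradient of the mid-plane deformation $v$ and $\overline{W}$ an effective membrane density obtained by relaxation.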
On the usage of joint diagonalization in multivariate statistics
2022
Scatter matrices generalize the covariance matrix and are useful in many multivariate data analysis methods, including the well-known principal component analysis (PCA), which is based on the diagonalization of the covariance matrix. The simultaneous diagonalization of two or more scatter matrices goes beyond PCA and is used increasingly often. In this paper, we offer an overview of many methods that are based on joint diagonalization. These methods range from the unsupervised context, with invariant coordinate selection and blind source separation (which includes independent component analysis), to the supervised context, with discriminant analysis and sliced inverse regression. They also enco…
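A minimal sketch of jointly diagonalizing two scatter matrices, using the covariance and the fourth-moment (FOBI-style) scatter on hypothetical mixed sources; the sources, mixing matrix, and choice of second scatter are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two independent non-Gaussian sources, linearly mixed.
S = np.column_stack([rng.uniform(-1, 1, 2000), rng.laplace(size=2000)])
X = S @ np.array([[2.0, 1.0], [0.5, 1.5]])
X -= X.mean(axis=0)

# Scatter 1: the covariance matrix.
cov = np.cov(X, rowvar=False)

# Whiten with respect to the covariance ...
vals, vecs = np.linalg.eigh(cov)
W = vecs @ np.diag(vals ** -0.5) @ vecs.T
Z = X @ W

# ... then diagonalize the fourth-moment scatter of the whitened data.
r2 = np.sum(Z ** 2, axis=1)
cov4 = (Z * r2[:, None]).T @ Z / len(Z)
_, V = np.linalg.eigh(cov4)

# B diagonalizes both scatters: cov becomes the identity, cov4 diagonal.
B = W @ V
print(np.round(B.T @ cov @ B, 2))
```

This two-step construction (whiten with one scatter, eigendecompose the other) is the standard computational route to the joint diagonalization used in ICS and independent component analysis.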
Linear Feature Extraction for Ranking
2018
We address the feature extraction problem for document ranking in information retrieval and propose LifeRank, a Linear feature extraction algorithm for Ranking. In LifeRank, we regard each document collection for ranking as a matrix, referred to as the original matrix, and optimize a transformation matrix so that a new matrix (dataset) can be generated as the product of the original matrix and the transformation matrix. The transformation matrix projects high-dimensional document vectors into lower dimensions. Theoretically, there could be a very large number of transformation matrices, each leading to a different generated matrix. In LifeRank, we produce a transformation matrix so that the generat…
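The core projection step can be sketched as a single matrix product; the transformation matrix below is a PCA-style placeholder on hypothetical data, not the ranking-optimized matrix that LifeRank produces:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical document collection: 100 documents, 30 ranking features.
X = rng.normal(size=(100, 30))

# A transformation matrix projecting features down to 5 dimensions.
# LifeRank optimizes this matrix for ranking quality; the top right
# singular vectors are used here purely as a stand-in.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
T = Vt[:5].T                      # 30 x 5 transformation matrix

# The new generated dataset: product of original and transformation matrices.
X_new = X @ T
print(X_new.shape)
```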